# Numerical Reasoning

## Matcha Base
google · Apache-2.0 · Image-to-Text · Transformers · Multilingual · Downloads: 2,445 · Likes: 26

MatCha is a vision-language model focused on chart understanding and mathematical reasoning; it strengthens these capabilities through joint modeling of chart and language data.

## Matcha Chartqa
google · Apache-2.0 · Image-to-Text · Transformers · Multilingual · Downloads: 1,060 · Likes: 41

MatCha is a pre-trained model that strengthens the ability of vision-language models to process chart and language data, and it excels at chart question answering.

## Matcha Chart2text Pew
google · Apache-2.0 · Image-to-Text · Transformers · Multilingual · Downloads: 168 · Likes: 39

MatCha is a vision-language model based on the Pix2Struct architecture, specifically optimized for chart comprehension and numerical reasoning tasks, excelling in chart-based question answering.
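
All three MatCha checkpoints are loaded through the Pix2Struct classes in the transformers library. Below is a minimal sketch of chart question answering with the google/matcha-chartqa checkpoint listed above; the image path and question are placeholders.

```python
# Minimal sketch: chart QA with a MatCha checkpoint via transformers'
# Pix2Struct classes. "chart.png" is a placeholder for any rendered chart.
from PIL import Image
from transformers import Pix2StructForConditionalGeneration, Pix2StructProcessor

processor = Pix2StructProcessor.from_pretrained("google/matcha-chartqa")
model = Pix2StructForConditionalGeneration.from_pretrained("google/matcha-chartqa")

image = Image.open("chart.png")
question = "Which year had the highest value?"

# Pix2Struct renders the question onto the image, so text and image are
# passed to the processor together.
inputs = processor(images=image, text=question, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(processor.decode(outputs[0], skip_special_tokens=True))
```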

## Tapas Temporary Repo
lysandre · Apache-2.0 · Question Answering · Transformers · English · Downloads: 3,443 · Likes: 0

TAPAS is a table-based question answering model that handles conversational QA tasks on tabular data through pre-training and fine-tuning.

## Tapas Large Finetuned Wtq
google · Apache-2.0 · Question Answering · Transformers · English · Downloads: 124.85k · Likes: 141

TAPAS is a table question answering model based on the BERT architecture, pre-trained in a self-supervised manner on Wikipedia table data; it supports natural language question answering over table content.
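
The WTQ-fine-tuned TAPAS checkpoints can be driven through the transformers table-question-answering pipeline, which returns the selected cells along with the aggregation operator (SUM, COUNT, AVERAGE, NONE) the model predicts. A minimal sketch with this checkpoint and a made-up table:

```python
# Minimal sketch: table QA with a WTQ-fine-tuned TAPAS checkpoint via the
# transformers table-question-answering pipeline. The table is invented.
from transformers import pipeline

tqa = pipeline("table-question-answering", model="google/tapas-large-finetuned-wtq")

# TAPAS expects every cell to be a string, including numbers.
table = {
    "Year": ["2020", "2021", "2022"],
    "Revenue": ["120", "135", "160"],
}

result = tqa(table=table, query="What is the total revenue?")
print(result["answer"])      # e.g. "SUM > 120, 135, 160"
print(result["aggregator"])  # aggregation operator chosen by the model
```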

## Tapas Small Finetuned Sqa
google · Apache-2.0 · Question Answering · Transformers · English · Downloads: 759 · Likes: 1

A small version of TAPAS with intermediate pre-training, fine-tuned on the SQA dataset; suitable for table question answering in conversational settings.

## Tapas Small
google · Apache-2.0 · Large Language Model · Transformers · English · Downloads: 41 · Likes: 0

TAPAS is a Transformer-based table question answering model pre-trained in a self-supervised manner on Wikipedia tables and associated text, supporting table understanding and question answering tasks.

## Tapas Mini
google · Apache-2.0 · Large Language Model · Transformers · English · Downloads: 15 · Likes: 0

TAPAS is a BERT-like model based on the Transformer architecture, specifically designed for processing tabular data and related text, pretrained in a self-supervised manner on Wikipedia table data.

## Tapas Medium Finetuned Wikisql Supervised
google · Apache-2.0 · Question Answering · Transformers · English · Downloads: 19 · Likes: 0

TAPAS is a Transformer-based table question answering model, pre-trained in a self-supervised manner on English Wikipedia table data and fine-tuned with supervision on the WikiSQL dataset.

## Tapas Base Finetuned Wikisql Supervised
google · Apache-2.0 · Question Answering · Transformers · English · Downloads: 737 · Likes: 9

TAPAS is a BERT-based Transformer model specifically designed for table question answering tasks. It is pre-trained in a self-supervised manner on English Wikipedia table data and supports weakly supervised table parsing.

## Tapas Base
google · Apache-2.0 · Large Language Model · Transformers · English · Downloads: 2,457 · Likes: 7

A table understanding model based on the BERT architecture, pretrained on Wikipedia table data through self-supervised learning; it supports table question answering and statement verification tasks.

## Tapas Base Finetuned Sqa
google · Apache-2.0 · Question Answering · Transformers · English · Downloads: 1,867 · Likes: 6

A table question answering model based on the BERT architecture, enhanced with intermediate pretraining for numerical reasoning and fine-tuned on the SQA dataset.
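
Because the SQA checkpoints are fine-tuned for sequential (conversational) question answering, the transformers table-question-answering pipeline can process a list of questions in order with sequential=True, letting later questions build on earlier answers. A minimal sketch with this checkpoint and an invented table:

```python
# Minimal sketch: conversational table QA with an SQA-fine-tuned TAPAS
# checkpoint. sequential=True makes the pipeline answer the questions in
# order, conditioning on previous answers. The table is invented.
from transformers import pipeline

tqa = pipeline("table-question-answering", model="google/tapas-base-finetuned-sqa")

table = {
    "Player": ["Ada", "Ben", "Cleo"],
    "Points": ["31", "24", "28"],
}

questions = [
    "Who scored the most points?",
    "How many points did they score?",
]

for result in tqa(table=table, query=questions, sequential=True):
    print(result["answer"])
```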

## Tapas Large Finetuned Wikisql Supervised
google · Apache-2.0 · Question Answering · Transformers · English · Downloads: 80 · Likes: 6

TAPAS is a BERT-like Transformer model designed for table-based question answering tasks. It is pre-trained in a self-supervised manner on English Wikipedia table corpora and fine-tuned on the WikiSQL dataset.

## Tapas Medium
google · Apache-2.0 · Large Language Model · Transformers · English · Downloads: 23 · Likes: 0

A Transformer-based table question answering model, pretrained in a self-supervised manner on English Wikipedia tables and associated text.

## Tapas Large Finetuned Sqa
google · Apache-2.0 · Question Answering · Transformers · English · Downloads: 71 · Likes: 7

This model is the large version of TAPAS, fine-tuned for sequential question answering (SQA) tasks, suitable for table-related question answering scenarios.

## Tapas Mini Finetuned Sqa
google · Apache-2.0 · Question Answering · Transformers · English · Downloads: 24 · Likes: 4

The TAPAS mini model is a table question answering model with intermediate pretraining, fine-tuned on the SQA dataset and using relative position embeddings.

## Tapas Small Finetuned Wtq
google · Apache-2.0 · Question Answering · Transformers · English · Downloads: 406 · Likes: 5

This model is a small version of TAPAS, fine-tuned on the WikiTableQuestions dataset for table-based question answering tasks.

## Tapas Medium Finetuned Wtq
google · Apache-2.0 · Question Answering · Transformers · English · Downloads: 77 · Likes: 2

This is a medium-sized table question answering model based on the TAPAS architecture, fine-tuned on the WikiTableQuestions dataset and suitable for QA over tabular data.

## Tapas Tiny Finetuned Wtq
google · Apache-2.0 · Question Answering · Transformers · English · Downloads: 1,894 · Likes: 1

TAPAS is a tiny Transformer model optimized for table question answering, achieving table comprehension through intermediate pretraining and chained fine-tuning across multiple datasets.

## Tapas Large
google · Apache-2.0 · Large Language Model · Transformers · English · Downloads: 211 · Likes: 2

TAPAS is a BERT-like model based on the Transformer architecture, specifically designed for processing tabular data and related text. It is pre-trained through self-supervised learning on a massive collection of English Wikipedia tables and associated text.

## Tapas Mini Finetuned Wtq
google · Apache-2.0 · Question Answering · Transformers · English · Downloads: 35 · Likes: 2

This model is a mini version of the TAPAS architecture, fine-tuned on the WikiTableQuestions (WTQ) dataset for table question answering.

## Tapas Base Finetuned Wtq
google · Apache-2.0 · Question Answering · Transformers · English · Downloads: 23.03k · Likes: 217

TAPAS is a Transformer-based table question answering model, pre-trained on Wikipedia table data through self-supervised learning and fine-tuned on datasets like WTQ.
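
When the raw predictions are needed rather than the pipeline output, the TAPAS checkpoints can also be used directly with TapasTokenizer and TapasForQuestionAnswering, decoding answer coordinates and the aggregation operator by hand. A minimal sketch of that approach with this checkpoint; the table and query are invented.

```python
# Minimal sketch: lower-level table QA with a WTQ-fine-tuned TAPAS
# checkpoint, without the pipeline. The table and query are invented.
import pandas as pd
from transformers import TapasForQuestionAnswering, TapasTokenizer

name = "google/tapas-base-finetuned-wtq"
tokenizer = TapasTokenizer.from_pretrained(name)
model = TapasForQuestionAnswering.from_pretrained(name)

# TapasTokenizer takes a pandas DataFrame whose cells are all strings.
table = pd.DataFrame({"City": ["Oslo", "Lima"], "Population": ["709000", "9750000"]})
queries = ["Which city has the larger population?"]

inputs = tokenizer(table=table, queries=queries, padding="max_length", return_tensors="pt")
outputs = model(**inputs)

# Turn token-level logits into cell coordinates and an aggregation index.
coords, agg_indices = tokenizer.convert_logits_to_predictions(
    inputs, outputs.logits.detach(), outputs.logits_aggregation.detach()
)
id2agg = {0: "NONE", 1: "SUM", 2: "AVERAGE", 3: "COUNT"}
for query, cells, agg in zip(queries, coords, agg_indices):
    answer = ", ".join(table.iat[cell] for cell in cells)
    print(f"{query} -> {id2agg[agg]}: {answer}")
```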

## Tapas Tiny Finetuned Sqa
google · Apache-2.0 · Question Answering · Transformers · English · Downloads: 2,391 · Likes: 0

TAPAS is a QA model for tabular data. This tiny version is fine-tuned on the SQA dataset, suitable for table-based QA tasks in conversational scenarios.